asymptotic complexity - meaning and definition. What is asymptotic complexity

What (who) is asymptotic complexity - definition

Measure of the amount of resources needed to run an algorithm or solve a computational problem
Asymptotic complexity; Computational Complexity; Bit complexity; Context of computational complexity; Complexity of computation (bit); Computational complexities

Asymptotic computational complexity         
In theory of computation
Asymptotic time complexity; Asymptotic space complexity
In computational complexity theory, asymptotic computational complexity is the usage of asymptotic analysis for the estimation of computational complexity of algorithms and computational problems, commonly associated with the usage of the big O notation.
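
For illustration (a sketch added here, not part of the dictionary entry), the following Python fragment contrasts two ways of summing the integers 1..n: a loop that performs n additions, with asymptotic time complexity O(n), and a closed-form expression that takes constant time, O(1). Asymptotic analysis and big O notation describe only how the cost grows with n, ignoring constant factors.

# Illustrative sketch (assumption, not from the source entry): two ways to
# compute 1 + 2 + ... + n with different asymptotic time complexities.

def sum_linear(n):
    # Performs n additions in a loop: time complexity O(n).
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_constant(n):
    # Uses the closed-form formula n(n + 1)/2: time complexity O(1).
    return n * (n + 1) // 2

print(sum_linear(1000), sum_constant(1000))  # both print 500500
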
Computational complexity         
In computer science, the computational complexity or simply complexity of an algorithm is the amount of resources required to run it. Particular focus is given to computation time (generally measured by the number of needed elementary operations) and memory storage requirements.
computational complexity         
<algorithm> The number of steps or arithmetic operations required to solve a computational problem. One of the three kinds of complexity. (1996-04-24)

Wikipedia

Computational complexity

In computer science, the computational complexity or simply complexity of an algorithm is the amount of resources required to run it. Particular focus is given to computation time (generally measured by the number of needed elementary operations) and memory storage requirements. The complexity of a problem is the complexity of the best algorithms that allow solving the problem.

The study of the complexity of explicitly given algorithms is called analysis of algorithms, while the study of the complexity of problems is called computational complexity theory. Both areas are closely related, as the complexity of an algorithm is always an upper bound on the complexity of the problem solved by this algorithm. Moreover, for designing efficient algorithms, it is often fundamental to compare the complexity of a specific algorithm to the complexity of the problem to be solved. Also, in most cases, the only thing that is known about the complexity of a problem is that it is no greater than the complexity of the most efficient known algorithms. Therefore, there is a large overlap between analysis of algorithms and complexity theory.
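
For example (an illustrative sketch, not part of the source article), merge sort performs O(n log n) comparisons in the worst case, so in the comparison model the complexity of the sorting problem is at most O(n log n); together with the known Ω(n log n) comparison lower bound, this determines the problem's complexity up to constant factors. A minimal Python version of merge sort:

# Illustrative sketch (not from the source): merge sort, whose O(n log n)
# worst-case comparison count is an upper bound on the complexity of the
# sorting problem in the comparison model.

def merge_sort(items):
    # Recursively split the list, then merge the two sorted halves.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge step: linear in the total number of elements.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 3, 8, 1, 2]))  # [1, 2, 3, 5, 8]
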

As the amount of resources required to run an algorithm generally varies with the size of the input, the complexity is typically expressed as a function n → f(n), where n is the size of the input and f(n) is either the worst-case complexity (the maximum amount of resources needed over all inputs of size n) or the average-case complexity (the average amount of resources over all inputs of size n). Time complexity is generally expressed as the number of required elementary operations on an input of size n, where elementary operations are assumed to take a constant amount of time on a given computer and change only by a constant factor when run on a different computer. Space complexity is generally expressed as the amount of memory required by an algorithm on an input of size n.
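
To make these definitions concrete, here is an illustrative Python sketch (an addition, not part of the source article) that counts the elementary operations, here comparisons, made by linear search: the worst case inspects all n elements, while averaging over the n possible positions of the key gives (n + 1)/2 comparisons, so both the worst-case and average-case time complexities are Θ(n).

# Illustrative sketch (assumption, not from the source): counting comparisons
# made by linear search to contrast worst-case and average-case complexity.

def linear_search_comparisons(items, key):
    # Returns how many comparisons are made before the key is found.
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == key:
            break
    return comparisons

n = 1000
items = list(range(n))

worst = linear_search_comparisons(items, n - 1)  # key in last position: n comparisons
average = sum(linear_search_comparisons(items, k) for k in items) / n  # (n + 1) / 2

print(worst, average)  # 1000 500.5
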